Asymptotically minimax regret by Bayes mixtures - Proceedings of the 1998 IEEE International Symposium on Information Theory

Author

  • Jun-ichi Takeuchi
Abstract

We study the problem of data compression, gambling and prediction of a sequence x^n = x1 x2 ... xn from a certain alphabet X, in terms of regret [4] and redundancy with respect to a general exponential family, a general smooth family, and also Markov sources. In particular, we show that variants of the Jeffreys mixture asymptotically achieve their minimax values. These results generalize the work by Xie and Barron [5, 6] to general smooth families. In particular, for one-dimensional exponential families, they also extend the work of Clarke and Barron [1] to deal with the full natural parameter space rather than compact sets interior to it. The worst-case regret of a probability density q with respect to a d-dimensional family of probability densities S = {p(·|θ) : θ ∈ Θ} and a set of sequences W_n ⊆ X^n is defined as
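The formula itself is cut off in this excerpt. In the standard formulation used in the regret literature of Xie and Barron [5, 6], the worst-case regret (the notation r_n below is an assumed label, not taken from this abstract) would read:

```latex
\[
  r_n(q, W_n) \;=\; \max_{x^n \in W_n}
  \left( \log \frac{1}{q(x^n)}
  \;-\; \min_{\theta \in \Theta} \log \frac{1}{p(x^n \mid \theta)} \right)
\]
```

That is, the excess code length of q over the best density in S chosen with hindsight for the observed sequence, maximized over all sequences in W_n.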


Similar articles

Robustly Minimax Codes for Universal Data Compression

We introduce a notion of ‘relative redundancy’ for universal data compression and propose a universal code which asymptotically achieves the minimax value of the relative redundancy. The relative redundancy is a hybrid of redundancy and coding regret (pointwise redundancy), where a class of information sources and a class of codes are assumed. The minimax code for relative redundancy is an exte...


Asymptotic minimax regret for data compression, gambling, and prediction

For problems of data compression, gambling, and prediction of individual sequences x1, ..., xn, the following questions arise. Given a target family of probability mass functions p(x1, ..., xn | θ), how do we choose a probability mass function q(x1, ..., xn) so that it approximately minimizes the maximum regret max (log 1/q(x1, ..., xn) − log 1/p(x1, ..., xn | θ̂)) and so that it achieves the best constant in the asymptot...


Capacity Definitions and Coding Strategies for General Channels with Receiver Side Information - Information Theory, 1998. Proceedings. 1998 IEEE International Symposium on

We consider three capacity definitions for a channel with channel side information at the receiver. The capacity is the highest rate asymptotically achievable. The outage capacity is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest expected rate asymptotically achievable using a single encoder and ...


Asymptotically minimax regret for exponential families

We study the problem of data compression, gambling and prediction of a sequence x^n = x1 x2 ... xn from a certain alphabet X, in terms of regret and redundancy with respect to a general exponential family. In particular, we evaluate the regret of the Bayes mixture density and show that it asymptotically achieves the minimax value when variants of the Jeffreys prior are used. Keywords— universal codi...


Sequential Prediction of Individual Sequences Under General Loss Functions

We consider adaptive sequential prediction of arbitrary binary sequences when the performance is evaluated using a general loss function. The goal is to predict on each individual sequence nearly as well as the best prediction strategy in a given comparison class of (possibly adaptive) prediction strategies, called experts. By using a general loss function, we generalize previous work on univer...




Publication date: 2004